High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)
Abstract
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. The computing challenges require adopting new strategies in algorithms, software, and hardware at multiple levels in the HEP computational pyramid. A significant issue is the human element: the need to train a scientific and technical workforce that can make optimum use of state-of-the-art computational technologies and is ready to adapt as the landscape changes. To help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers: 1) software effectiveness, and 2) infrastructure and expertise advancement. These drivers had been identified in a number of previous studies, including the 2013 HEP Topical Panel on Computing, the 2013 Snowmass Study, and the 2014 P5 report. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. A choice was made to focus on offline computing in HEP experiments, even though there can be nontrivial connections between offline and online computing.

This document begins with a summary of the main conclusions and directions contained in the three reports, as well as a statement of the cross-cutting themes that emerge from them. Because the scope of HEP computing is so wide, it was impossible to give every technical area its due in the necessarily finite space of the individual reports. By covering some computational activities in more detail than others, the aim has been to convey the key points that are independent of the individual research projects or science directions. The three main reports follow in order after the summary.

The Applications Software Working Group undertook a survey of members of the HEP community to ensure a broad perspective in the report. While not a complete sample of the HEP community, the respondents covered a range of experiments and projects, and several dozen applications were discussed in the responses. This mass of information helped to identify some of the current strengths and weaknesses of the HEP computing effort.

A number of conclusions have emerged from the reports. These include assessments of the current software base, consolidation and management of software packages, sharing of libraries and tools, reactions to hardware evolution (including storage and networks), and possibilities for exploiting new computational resources. The important role of schools and training programs in increasing awareness of modern software practices and computational architectures was emphasized. A thread running across the reports relates to the difficulties in establishing rewarding career paths for HEP computational scientists. Given the scale of modern software development, it is important to recognize a significant community-level software commitment as a technical undertaking on par with major detector R&D. Conclusions from the reports have ramifications for how computational activities are carried out across all of HEP.
A subset of these conclusions has helped identify initial actionable items for HEP-FCE activities, with the goal of producing tangible results in a finite time to benefit large fractions of the HEP community. These include applications of next-generation architectures, use of HPC resources for HEP experiments, data-intensive computing (virtualization and containers), and easy-to-use, production-level wide area networking. A significant fraction of this work involves collaboration with DOE ASCR facilities and staff.
Journal: CoRR
Volume: abs/1510.08545
Publication date: 2015